- From prechelt@ira.uka.de (Lutz Prechelt)
- Newsgroups: comp.ai.neural-nets,comp.answers,news.answers
- Subject: FAQ in comp.ai.neural-nets -- monthly posting
- Date: 1 Jun 1993 07:35:06 GMT
-
- Archive-name: neural-net-faq
- Last-modified: 93/05/20
-
- (FAQ means "Frequently Asked Questions")
-
- ------------------------------------------------------------------------
- Anybody who is willing to contribute any question or
- information, please email me; if it is relevant,
- I will incorporate it. But: Please format your contribution
- appropriately so that I can just drop it in.
-
- The monthly posting is sent out on the 28th of every month.
- ------------------------------------------------------------------------
-
- This is a monthly posting to the Usenet newsgroup comp.ai.neural-nets
- (and news.answers, where it should be findable at ANY time).
- Its purpose is to provide basic information for individuals who are
- new to the field of neural networks or are just beginning to read this
- group. It should help to avoid lengthy discussions of questions that
- usually arise for beginners of one kind or the other.
-
- >>>>> SO, PLEASE, SEARCH THIS POSTING FIRST IF YOU HAVE A QUESTION <<<<<
- and
- >>>>> DON'T POST ANSWERS TO FAQs: POINT THE ASKER TO THIS POSTING <<<<<
-
- This posting is archived in the periodic posting archive on
- host rtfm.mit.edu (and on some other hosts as well).
- Look in the anonymous ftp directory "/pub/usenet/news.answers",
- the filename is as given in the 'Archive-name:' header above.
- If you do not have anonymous ftp access, you can access the archives
- by mail server as well. Send an E-mail message to
- mail-server@rtfm.mit.edu with "help" and "index" in the body on
- separate lines for more information.
-
-
- The monthly posting is not meant to discuss any topic exhaustively.
-
- Disclaimer: This posting is provided 'as is'.
- No warranty whatsoever is expressed or implied,
- in particular, no warranty that the information contained herein
- is correct or useful in any way, although both are intended.
-
- >> To find the answer to question number <x> (if present at all), search
- >> for the string "-A<x>.)" (so the answer to question 12 is at "-A12.)")
-
- And now, in the end, we begin:
-
- ============================== Questions ==============================
-
- (the short forms and non-continuous numbering are intended)
- 1.) What is this newsgroup for ? How shall it be used ?
- 2.) What is a neural network (NN) ?
- 3.) What can you do with a Neural Network and what not ?
- 4.) Who is concerned with Neural Networks ?
-
- 6.) What does 'backprop' mean ?
- 7.) How many learning methods for NNs exist ? Which ?
- 8.) What about Genetic Algorithms ?
- 9.) What about Fuzzy Logic ?
-
- 10.) Good introductory literature about Neural Networks ?
- 11.) Any journals and magazines about Neural Networks ?
- 12.) The most important conferences concerned with Neural Networks ?
- 13.) Neural Network Associations ?
- 14.) Other sources of information about NNs ?
-
- ============================== Answers ==============================
-
- ------------------------------------------------------------------------
-
- -A1.) What is this newsgroup for ?
-
- The newsgroup comp.ai.neural-nets is intended as a forum for people who want
- to use or explore the capabilities of Neural Networks or Neural-Network-like
- structures.
-
- There should be the following types of articles in this newsgroup:
-
- 1. Requests
-
- Requests are articles of the form
- "I am looking for X"
- where X is something public like a book, an article, a piece of software.
-
- If multiple different answers can be expected, the person making the
- request should be prepared to make a summary of the answers he/she
- receives, and should announce this with a phrase like
- "Please email, I'll summarize"
- at the end of the posting.
-
- The Subject line of the posting should then be something like
- "Request: X"
-
- 2. Questions
-
- As opposed to requests, questions are concerned with something so specific
- that general interest cannot readily be assumed.
- If the poster thinks that the topic is of some general interest,
- he/she should announce a summary (see above).
-
- The Subject line of the posting should be something like
- "Question: this-and-that"
- or have the form of a question (i.e., end with a question mark)
-
- 3. Answers
-
- These are reactions to questions or requests.
- As a rule of thumb, articles of type "answer" should be rare.
- Ideally, in most cases either the answer is too specific to be of general
- interest (and should thus be e-mailed to the poster) or a summary
- was announced with the question or request (and answers should
- thus be e-mailed to the poster).
-
- The subject lines of answers are automatically adjusted by the
- news software.
-
- 4. Summaries
-
- For all requests or questions whose answers can be assumed to be of
- some general interest, the poster of the request or question
- shall summarize the answers he/she received.
- Such a summary should be announced in the original posting of the question
- or request with a phrase like
- "Please answer by email, I'll summarize"
-
- In such a case answers should NOT be posted to the newsgroup but instead
- be mailed to the poster who collects and reviews them.
- About 10 to 20 days after the original posting, its poster should
- compile the summary of answers and post it to the net.
-
- Some care should be invested into a summary:
- a) simple concatenation of all the answers is not enough;
- instead redundancies, irrelevancies, verbosities and
- errors must be filtered out (as well as possible),
- b) the answers shall be separated clearly
- c) the contributors of the individual answers shall be identifiable
- (unless they requested to remain anonymous [yes, that happens])
- d) the summary shall start with the "quintessence" of the answers,
- as seen by the original poster
- e) A summary should, when posted, clearly be indicated to be one
- by giving it a Subject line starting with "Summary:"
-
- Note that a good summary is pure gold for the rest of the newsgroup
- community, so summary work will be most appreciated by all of us.
- (Good summaries are more valuable than any moderator ! :-> )
-
- 5. Announcements
-
- Some articles never need any public reaction.
- These are called announcements (for instance for a workshop,
- conference or the availability of some technical report or
- software system).
-
- Announcements should be clearly indicated to be such by giving
- them a subject line of the form
- "Announcement: this-and-that"
-
- 6. Reports
-
- Sometimes people spontaneously want to report something to the
- newsgroup. This might be special experiences with some software,
- results of one's own experiments or conceptual work, or especially
- interesting information from somewhere else.
-
- Reports should be clearly indicated to be such by giving
- them a subject line of the form
- "Report: this-and-that"
-
- 7. Discussions
-
- An especially valuable feature of Usenet is of course the possibility
- of discussing a certain topic with hundreds of potential participants.
- All traffic in the newsgroup that cannot be subsumed under one of
- the above categories should belong to a discussion.
-
- If somebody explicitly wants to start a discussion, he/she can do so
- by giving the posting a subject line of the form
- "Start discussion: this-and-that"
- (People who react to this: please remove the
- "Start discussion: " label from the subject line of your replies.)
-
- It is quite difficult to keep a discussion from drifting into chaos,
- but, unfortunately, as many other newsgroups show, there seems
- to be no sure way to avoid this.
- On the other hand, comp.ai.neural-nets has not had many problems
- with this effect in the past, so let's just go and hope... :->
-
- ------------------------------------------------------------------------
-
- -A2.) What is a neural network (NN) ?
-
- [anybody there to write something better?
- buzzwords: artificial vs. natural/biological; units and
- connections; value passing; inputs and outputs; storage in structure
- and weights; only local information; highly parallel operation ]
-
- First of all, when we are talking about a neural network, we *should*
- usually rather say "artificial neural network" (ANN), because that is
- what we mean most of the time. Biological neural networks are much
- more complicated in their elementary structures than the mathematical
- models we use for ANNs.
-
- A vague description is as follows:
-
- An ANN is a network of many very simple processors ("units"), each
- possibly having a (small amount of) local memory. The units are
- connected by unidirectional communication channels ("connections"),
- which carry numeric (as opposed to symbolic) data. The units operate
- only on their local data and on the inputs they receive via the
- connections.
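- 
- As a small illustration (a hypothetical sketch in Python, not code
- from any particular package), a single unit of this kind might be
- written as follows:
- 
-     # One "unit": combines its numeric inputs with its local
-     # weights (its only memory) and passes the result on.
-     def unit_output(inputs, weights, bias):
-         net = sum(i * w for i, w in zip(inputs, weights)) + bias
-         return 1.0 if net > 0 else 0.0   # simple threshold activation
- 
-     print(unit_output([1.0, 0.5], [0.8, -0.4], -0.3))   # prints 1.0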
-
- The design motivation is what distinguishes neural networks from other
- mathematical techniques:
-
- A neural network is a processing device, either an algorithm, or actual
- hardware, whose design was motivated by the design and functioning of human
- brains and components thereof.
-
- Most neural networks have some sort of "training" rule
- whereby the weights of connections are adjusted on the basis of
- presented patterns.
- In other words, neural networks "learn" from examples,
- just as children learn to recognize dogs from examples of dogs,
- and exhibit some structural capability for generalization.
-
- Neural networks normally have great potential for parallelism, since
- the computations of the components are independent of each other.
-
- ------------------------------------------------------------------------
-
- -A3.) What can you do with a Neural Network and what not ?
-
- [preliminary]
-
- In principle, NNs can compute any computable function, i.e. they can
- do everything a normal digital computer can do.
- In particular, anything that can be represented as a mapping between
- vector spaces can be approximated to arbitrary precision by feedforward
- NNs (which are the most often used type).
-
- In practice, NNs are especially useful for mapping problems
- which are tolerant of a high error rate, have lots of example data
- available, but to which hard and fast rules cannot easily be applied.
-
- NNs are especially bad for problems that are concerned with manipulation
- of symbols and for problems that need short-term memory.
-
- ------------------------------------------------------------------------
-
- -A4.) Who is concerned with Neural Networks ?
-
- Neural Networks are interesting for quite a lot of very dissimilar people:
-
- - Computer scientists want to find out about the properties of
- non-symbolic information processing with neural nets and about learning
- systems in general.
- - Engineers of many kinds want to exploit the capabilities of
- neural networks in many areas (e.g. signal processing) to solve
- their application problems.
- - Cognitive scientists view neural networks as a possible apparatus to
- describe models of thinking and consciousness (high-level brain function).
- - Neuro-physiologists use neural networks to describe and explore
- medium-level brain function (e.g. memory, sensory system, motor control).
- - Physicists use neural networks to model phenomena in statistical
- mechanics and for a lot of other tasks.
- - Biologists use Neural Networks to interpret nucleotide sequences.
- - Philosophers and some other people may also be interested in
- Neural Networks for various reasons.
-
- ------------------------------------------------------------------------
-
- -A6.) What does 'backprop' mean ?
-
- [anybody to write something similarly short,
- but easier to understand for a beginner ? ]
-
- It is an abbreviation for 'backpropagation of error', which is the
- most widely used learning method for neural networks today.
- Although it has many disadvantages, which could be summarized in the
- sentence
- "You hardly know what you are actually doing
- when using backpropagation" :-)
- it has had considerable success in practical applications and is
- relatively easy to apply.
-
- It is for the training of layered (i.e., nodes are grouped
- in layers) feedforward (i.e., the arcs joining nodes are
- unidirectional, and there are no cycles) nets.
-
- Back-propagation needs a teacher that knows the correct output for any
- input ("supervised learning") and uses gradient descent on the error
- (as provided by the teacher) to train the weights. The activation
- function is (usually) a sigmoidal (i.e., bounded above and below, but
- differentiable) function of a weighted sum of the node's inputs.
-
- The use of a gradient descent algorithm to train the weights makes it
- slow to train; but being a feedforward network, it is quite rapid during
- the recall phase.
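- 
- For the concreteness-minded, here is a minimal backprop sketch in
- Python (a toy 2-2-1 network learning XOR; all names and constants are
- invented for this example, and it is not code from any particular
- package; other random starts may need more epochs):
- 
-     import math, random
- 
-     def sigmoid(x):
-         return 1.0 / (1.0 + math.exp(-x))
- 
-     random.seed(1)
-     # w[i][j]: input i -> hidden j;  v[j]: hidden j -> output
-     w  = [[random.uniform(-1, 1) for j in range(2)] for i in range(2)]
-     bh = [random.uniform(-1, 1) for j in range(2)]
-     v  = [random.uniform(-1, 1) for j in range(2)]
-     bo = random.uniform(-1, 1)
-     eta = 0.5                     # learning rate
-     data = [([0,0],0), ([0,1],1), ([1,0],1), ([1,1],0)]
- 
-     for epoch in range(10000):
-         for x, t in data:
-             # forward pass
-             h = [sigmoid(x[0]*w[0][j] + x[1]*w[1][j] + bh[j])
-                  for j in range(2)]
-             y = sigmoid(h[0]*v[0] + h[1]*v[1] + bo)
-             # backward pass: gradient of the squared error 0.5*(t-y)^2,
-             # using sigmoid'(net) = out * (1 - out)
-             dy = (y - t) * y * (1 - y)
-             dh = [dy * v[j] * h[j] * (1 - h[j]) for j in range(2)]
-             # gradient descent weight updates
-             for j in range(2):
-                 v[j]  -= eta * dy * h[j]
-                 bh[j] -= eta * dh[j]
-                 for i in range(2):
-                     w[i][j] -= eta * dh[j] * x[i]
-             bo -= eta * dy
- 
-     for x, t in data:             # recall: outputs should be near t
-         h = [sigmoid(x[0]*w[0][j] + x[1]*w[1][j] + bh[j]) for j in range(2)]
-         print(x, t, round(sigmoid(h[0]*v[0] + h[1]*v[1] + bo), 2))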
-
- Literature:
- Rumelhart, D. E. and McClelland, J. L. (1986):
- Parallel Distributed Processing: Explorations in the
- Microstructure of Cognition (volume 1, pp 318-362).
- The MIT Press.
- (this is the classic one) or one of the dozens of other books
- or articles on backpropagation :->
-
- ------------------------------------------------------------------------
-
- -A7.) How many learning methods for NNs exist ? Which ?
-
- There are many, many learning methods for NNs by now. Nobody knows
- exactly how many.
- New ones (or at least variations of existing ones) are invented every
- week. Below is a collection of some of the best known methods;
- it does not claim to be complete.
-
- The main categorization of these methods is the distinction between
- supervised and unsupervised learning:
-
- - In supervised learning, there is a "teacher" who in the learning
- phase "tells" the net how well it performs ("reinforcement learning")
- or what the correct behavior would have been ("fully supervised learning").
-
- - In unsupervised learning the net is autonomous: it just looks at
- the data it is presented with, finds out about some of the
- properties of the data set and learns to reflect these properties
- in its output. Exactly which properties the network can learn to
- recognise depends on the particular network model and learning method
- (see the small sketch below for the contrast between the two styles).
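- 
- A minimal sketch of this contrast in Python (both update rules are
- standard textbook forms; the function names are made up here):
- 
-     # Supervised: the delta rule needs a teacher-supplied target t.
-     def delta_update(w, x, t, y, eta=0.1):
-         return [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
- 
-     # Unsupervised: the plain Hebbian rule uses only the data and the
-     # net's own output y; no teacher is involved.
-     def hebb_update(w, x, y, eta=0.1):
-         return [wi + eta * y * xi for wi, xi in zip(w, x)]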
-
- Many of these learning methods are closely connected with a certain
- (class of) network topology.
-
- Now here is the list, just giving some names:
-
- 1. UNSUPERVISED LEARNING (i.e. without a "teacher"):
- 1). Feedback Nets:
- a). Additive Grossberg (AG)
- b). Shunting Grossberg (SG)
- c). Binary Adaptive Resonance Theory (ART1)
- d). Analog Adaptive Resonance Theory (ART2, ART2a)
- e). Discrete Hopfield (DH)
- f). Continuous Hopfield (CH)
- g). Discrete Bidirectional Associative Memory (BAM)
- h). Temporal Associative Memory (TAM)
- i). Adaptive Bidirectional Associative Memory (ABAM)
- j). Kohonen Self-organizing Map (SOM)
- k). Kohonen Topology-preserving Map (TPM)
- 2). Feedforward-only Nets:
- a). Learning Matrix (LM)
- b). Driver-Reinforcement Learning (DR)
- c). Linear Associative Memory (LAM)
- d). Optimal Linear Associative Memory (OLAM)
- e). Sparse Distributed Associative Memory (SDM)
- f). Fuzzy Associative Memory (FAM)
- g). Counterpropagation (CPN)
-
- 2. SUPERVISED LEARNING (i.e. with a "teacher"):
- 1). Feedback Nets:
- a). Brain-State-in-a-Box (BSB)
- b). Fuzzy Cognitive Map (FCM)
- c). Boltzmann Machine (BM)
- d). Mean Field Annealing (MFT)
- e). Recurrent Cascade Correlation (RCC)
- f). Learning Vector Quantization (LVQ)
- 2). Feedforward-only Nets:
- a). Perceptron
- b). Adaline, Madaline
- c). Backpropagation (BP)
- d). Cauchy Machine (CM)
- e). Adaptive Heuristic Critic (AHC)
- f). Time Delay Neural Network (TDNN)
- g). Associative Reward Penalty (ARP)
- h). Avalanche Matched Filter (AMF)
- i). Backpercolation (Perc)
- j). Artmap
- k). Adaptive Logic Network (ALN)
- l). Cascade Correlation (CasCor)
-
- ------------------------------------------------------------------------
-
- -A8.) What about Genetic Algorithms ?
-
- [preliminary]
- [Who will write a better introduction?]
-
- There are a number of definitions of GA (Genetic Algorithm).
- A possible one is
-
- A GA is an optimization program
- that starts with some encoded procedure, (Creation of Life :-> )
- mutates it stochastically, (Get cancer or so :-> )
- and uses a selection process (Darwinism)
- to prefer the mutants with high fitness
- and perhaps a recombination process (Make babies :-> )
- to combine properties of (preferably) the successful mutants.
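- 
- A toy illustration of this scheme in Python (the "one-max" problem:
- maximize the number of 1-bits in a string; the population size, rates
- and fitness function are arbitrary choices for this sketch, not any
- canonical GA):
- 
-     import random
- 
-     def fitness(bits):                      # "high fitness" = many 1s
-         return sum(bits)
- 
-     def mutate(bits, rate=0.05):            # stochastic mutation
-         return [b ^ 1 if random.random() < rate else b for b in bits]
- 
-     def crossover(a, b):                    # recombination
-         cut = random.randrange(1, len(a))
-         return a[:cut] + b[cut:]
- 
-     pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
-     for gen in range(50):
-         pop.sort(key=fitness, reverse=True)
-         parents = pop[:10]                  # selection (Darwinism)
-         pop = [mutate(crossover(random.choice(parents),
-                                 random.choice(parents)))
-                for _ in range(30)]
-     print(max(fitness(p) for p in pop))     # usually close to 20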
-
- There is a newsgroup that is dedicated to Genetic Algorithms
- called comp.ai.genetic.
- Some GA discussion also tends to happen in comp.ai.neural-nets.
- Another loosely relevant group is comp.theory.self-org-sys.
- There is a GA mailing list which you can subscribe to by
- sending a request to GA-List-Request@AIC.NRL.NAVY.MIL
- You can also try anonymous ftp to
- ftp.aic.nrl.navy.mil
- in the /pub/galist directory. There are papers and some software.
-
- For more details see (for example):
-
- "Genetic Algorithms in Search Optimisation and Machine Learning"
- by David Goldberg (Addison-Wesley 1989, 0-201-15767-5) or
-
- "Handbook of Genetic Algorithms"
- edited by Lawrence Davis (Van Nostrand Reinhold 1991, ISBN 0-442-00173-8) or
-
- "Classifier Systems and Genetic Algorithms"
- L.B. Booker, D.E. Goldberg and J.H. Holland, Techreport No. 8 (April 87),
- Cognitive Science and Machine Intelligence Laboratory, University of Michigan
- also reprinted in :
- Artificial Intelligence, Volume 40 (1989), pages 185-234
-
- ------------------------------------------------------------------------
-
- -A9.) What about Fuzzy Logic ?
-
- [preliminary]
- [Who will write an introduction?]
-
- Fuzzy Logic is an area of research based on the work of L.A. Zadeh.
- It is a departure from classical two-valued sets and logic; it uses
- "soft" linguistic system variables (e.g. large, hot, tall) and a
- continuous range of truth values in the interval [0,1], rather than
- strict binary (True or False) decisions and assignments.
- Fuzzy logic is used where a system is difficult to model, is
- controlled by a human operator or expert, or where ambiguity or
- vagueness is common. A typical fuzzy system consists of a rule base,
- membership functions, and an inference procedure.
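- 
- As a tiny illustration (a hypothetical membership function in Python;
- the linguistic term and the breakpoints are invented for this example):
- 
-     # Degree to which a height belongs to the fuzzy set "tall":
-     # 0 below 160 cm, 1 above 190 cm, linear in between.
-     def mu_tall(height_cm):
-         if height_cm <= 160: return 0.0
-         if height_cm >= 190: return 1.0
-         return (height_cm - 160) / 30.0
- 
-     print(mu_tall(175))   # 0.5 -- "somewhat tall", not a binary answer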
-
- Most Fuzzy Logic discussion takes place in the newsgroup comp.ai.fuzzy,
- but there is also some work (and discussion) about combining fuzzy
- logic with Neural Network approaches in comp.ai.neural-nets.
-
- For more details see (for example):
-
- Klir, G.J. and Folger, T.A., Fuzzy Sets, Uncertainty, and
- Information, Prentice-Hall, Englewood
- Cliffs, N.J., 1988.
-
- Kosko, B., Neural Networks and Fuzzy Systems, Prentice Hall,
- Englewood Cliffs, NJ, 1992.
-
- ------------------------------------------------------------------------
-
- -A10.) Good introductory literature about Neural Networks ?
-
- 0.) The best (subjectively, of course -- please don't flame me):
-
- Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley.
- Comments: "A good book", "comprises a nice historical overview and a chapter
- about NN hardware. Well structured prose. Makes important concepts clear."
-
- Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of
- Neural Computation. Addison-Wesley: Redwood City, California.
- ISBN 0-201-50395-6 (hardbound) and 0-201-51560-1 (paperbound)
- Comments: "My first impression is that this one is by far the best book on
- the topic. And it's below $30 for the paperback."; "Well written, theoretical
- (but not overwhelming)"; "It provides a good balance of model development,
- computational algorithms, and applications. The mathematical derivations
- are especially well done"; "Nice mathematical analysis on the mechanism of
- different learning algorithms"; "It is NOT for the mathematical beginner.
- If you don't have a good grasp of higher level math, this book can
- be really tough to get through."
-
-
- 1.) Books for the beginner:
-
- Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing.
- Chapman and Hall. (ISBN 0-412-37780-2).
- Comments: "This book seems to be intended for the first year of university
- education."
-
- Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction.
- Adam Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2).
- Comments: "It's clearly written. Lots of hints as to how to get the
- adaptive models covered to work (not always well explained in the
- original sources). Consistent mathematical terminology. Covers
- perceptrons, error-backpropagation, Kohonen self-org model, Hopfield
- type models, ART, and associative memories."
-
- Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction.
- Van Nostrand Reinhold: New York.
- Comments: "Like Wasserman's book, Dayhoff's book is also very easy to
- understand".
-
- McClelland, J. L. and Rumelhart, D. E. (1988).
- Explorations in Parallel Distributed Processing: Computational Models of
- Cognition and Perception (software manual). The MIT Press.
- Comments: "Written in a tutorial style, and includes 2 diskettes of NN
- simulation programs that can be compiled on MS-DOS or Unix (and they do
- too !)"; "The programs are pretty reasonable as an introduction to some
- of the things that NNs can do."; "There are *two* editions of this book.
- One comes with disks for the IBM PC, the other comes with disks for the
- Macintosh".
-
- McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural
- Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0).
- Comments: "No formulas at all (==> no good)"; "It does not have much
- detailed model development (very few equations), but it does present many
- areas of application. It includes a chapter on current areas of research.
- A variety of commercial applications is discussed in chapter 1. It also
- includes a program diskette with a fancy graphical interface (unlike the
- PDP diskette)".
-
- Orchard, G.A. & Phillips, W.A. (1991). Neural Computation: A
- Beginner's Guide. Lawrence Erlbaum Associates: London.
- Comments: "Short user-friendly introduction to the area, with a
- non-technical flavour. Apparently accompanies a software package, but I
- haven't seen that yet".
-
- Wasserman, P. D. (1989). Neural Computing: Theory & Practice.
- Van Nostrand Reinhold: New York. (ISBN 0-442-20743-3)
- Comments: "Wasserman flatly enumerates some common architectures from an
- engineer's perspective ('how it works') without ever addressing the underlying
- fundamentals ('why it works') - important basic concepts such as clustering,
- principal components or gradient descent are not treated. It's also full of
- errors, and unhelpful diagrams drawn with what appears to be PCB board layout
- software from the '70s. For anyone who wants to do active research in the
- field I consider it quite inadequate"; "Okay, but too shallow"; "Quite
- easy to understand";
- "The best bedtime reading for Neural Networks. I have given
- this book to numerous colleagues who want to know NN basics, but who never
- plan to implement anything. An excellent book to give your manager."
-
-
- 2.) The classics:
-
- Kohonen, T. (1984). Self-organization and Associative Memory. Springer-Verlag:
- New York. (2nd Edition: 1988; 3rd edition: 1989).
- Comments: "The section on Pattern mathematics is excellent."
-
- Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed
- Processing: Explorations in the Microstructure of Cognition (volumes 1 & 2).
- The MIT Press.
- Comments: "As a computer scientist I found the two Rumelhart and McClelland
- books really heavy going and definitely not the sort of thing to read if you
- are a beginner."; "It's quite readable, and affordable (about $65 for both
- volumes)."; "THE Connectionist bible.".
-
-
- 3.) Introductory journal articles:
-
- Hinton, G. E. (1989). Connectionist learning procedures.
- Artificial Intelligence, Vol. 40, pp. 185--234.
- Comments: "One of the better neural networks overview papers, although the
- distinction between network topology and learning algorithm is not always
- very clear. Could very well be used as an introduction to neural networks."
-
- Knight, K. (1990). Connectionist Ideas and Algorithms. Communications of
- the ACM. November 1990. Vol.33 nr.11, pp 59-74.
- Comments: "A good article, and it is easy for most people to find a copy
- of this journal."
-
- Kohonen, T. (1988). An Introduction to Neural Computing. Neural Networks,
- vol. 1, no. 1. pp. 3-16.
- Comments: "A general review".
-
-
- 4.) Not-quite-so-introductory literature:
-
- Anderson, J. A. and Rosenfeld, E. (Eds). (1988). Neurocomputing:
- Foundations of Research. The MIT Press: Cambridge, MA.
- Comments: "An expensive book, but excellent for reference. It is a
- collection of reprints of most of the major papers in the field.";
-
- Anderson, J. A., Pellionisz, A. and Rosenfeld, E. (Eds). (1990).
- Neurocomputing 2: Directions for Research. The MIT Press: Cambridge, MA.
- Comments: "The sequel to their well-known Neurocomputing book."
-
- Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems.
- MIT Press: Cambridge, Massachusetts. (ISBN 0-262-03156-6).
- Comments: "I guess one of the best books I read"; "May not be suited for
- people who want to do some research in the area".
-
- Khanna, T. (1990). Foundations of Neural Networks. Addison-Wesley: New York.
- Comments: "Not so bad (with a page of erroneous formulas (if I remember
- well), and #hidden layers isn't well described)."; "Khanna's intention
- in writing his book with math analysis should be commended but he
- made several mistakes in the math part".
-
- Levine, D. S. (1990). Introduction to Neural and Cognitive Modeling.
- Lawrence Erlbaum: Hillsdale, N.J.
- Comments: "Highly recommended".
-
- Lippmann, R. P. (April 1987). An introduction to computing with neural nets.
- IEEE Acoustics, Speech, and Signal Processing Magazine. vol. 4,
- no. 2, pp. 4-22.
- Comments: "Much acclaimed as an overview of neural networks, but rather
- inaccurate on several points. The categorization into binary and continuous-
- valued input neural networks is rather arbitrary, and may be confusing for
- the inexperienced reader. Not all networks discussed are of equal importance."
-
- Maren, A., Harston, C. and Pap, R., (1990). Handbook of Neural Computing
- Applications. Academic Press. ISBN: 0-12-471260-6. (451 pages)
- Comments: "They cover a broad area"; "Introductory with suggested
- applications implementation".
-
- Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks
- Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6)
- Comments: "An excellent book that ties together classical approaches
- to pattern recognition with Neural Nets. Most other NN books do not
- even mention conventional approaches."
-
- Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning
- representations by back-propagating errors. Nature, vol 323 (9 October),
- pp. 533-536.
- Comments: "Gives a very good potted explanation of backprop NN's. It gives
- sufficient detail to write your own NN simulation."
-
- Simpson, P. K. (1990). Artificial Neural Systems: Foundations, Paradigms,
- Applications and Implementations. Pergamon Press: New York.
- Comments: "Contains a very useful 37 page bibliography. A large number of
- paradigms are presented. On the negative side the book is very shallow.
- Best used as a complement to other books".
-
- Zeidenberg, M. (1990). Neural Networks in Artificial Intelligence.
- Ellis Horwood, Ltd., Chichester.
- Comments: "Gives the AI point of view".
-
- Zornetzer, S. F., Davis, J. L. and Lau, C. (1990). An Introduction to
- Neural and Electronic Networks. Academic Press. (ISBN 0-12-781881-2)
- Comments: "Covers quite a broad range of topics (collection of
- articles/papers )."; "Provides a primer-like introduction and overview for
- a broad audience, and employs a strong interdisciplinary emphasis".
-
- ------------------------------------------------------------------------
-
- -A11.) Any journals and magazines about Neural Networks ?
-
-
- [to be added: comments on speed of reviewing and publishing,
- whether they accept TeX format or ASCII by e-mail, etc.]
-
- A. Dedicated Neural Network Journals:
- =====================================
-
- Title: Neural Networks
- Publish: Pergamon Press
- Address: Pergamon Journals Inc., Fairview Park, Elmsford,
- New York 10523, USA and Pergamon Journals Ltd.
- Headington Hill Hall, Oxford OX3 0BW, England
- Freq.: 6 issues/year (vol. 1 in 1988)
- Cost/Yr: Free with INNS membership ($45?), Individual $65, Institution $175
- ISSN #: 0893-6080
- Remark: Official Journal of International Neural Network Society (INNS).
- Contains Original Contributions, Invited Review Articles, Letters
- to Editor, Invited Book Reviews, Editorials, Announcements and INNS
- News, Software Surveys. This is probably the most popular NN journal.
- (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
- -------
- Title: Neural Computation
- Publish: MIT Press
- Address: MIT Press Journals, 55 Hayward Street Cambridge,
- MA 02142-9949, USA, Phone: (617) 253-2889
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: Individual $45, Institution $90, Students $35; Add $9 Outside USA
- ISSN #: 0899-7667
- Remark: Combination of Reviews (10,000 words), Views (4,000 words)
- and Letters (2,000 words). I have found this journal to be of
- outstanding quality.
- (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
- -----
- Title: IEEE Transaction on Neural Networks
- Publish: Institute of Electrical and Electronics Engineers (IEEE)
- Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ,
- 08855-1331 USA. Tel: (201) 981-0060
- Cost/Yr: $10 for Members belonging to participating IEEE societies
- Freq.: Quarterly (vol. 1 in March 1990)
- Remark: Devoted to the science and technology of neural networks
- which disclose significant technical knowledge, exploratory
- developments and applications of neural networks from biology to
- software to hardware. Emphasis is on artificial neural networks.
- Specific aspects include self organizing systems, neurobiological
- connections, network dynamics and architecture, speech recognition,
- electronic and photonic implementation, robotics and controls.
- Includes Letters concerning new research results.
- (Note: Remarks are from journal announcement)
- -----
- Title: International Journal of Neural Systems
- Publish: World Scientific Publishing
- Address: USA: World Scientific Publishing Co., 687 Hartwell Street, Teaneck,
- NJ 07666. Tel: (201) 837-8858; Europe: World Scientific Publishing
- Co. Pte. Ltd., 73 Lynton Mead, Totteridge, London N20-8DH, England.
- Tel: (01) 4462461; Other: World Scientific Publishing Co. Pte. Ltd.,
- Farrer Road, P.O. Box 128, Singapore 9128. Tel: 2786188
- Freq.: Quarterly (Vol. 1 in 1990?)
- Cost/Yr: Individual $42, Institution $88 (plus $9-$17 for postage)
- ISSN #: 0129-0657 (IJNS)
- Remark: The International Journal of Neural Systems is a quarterly journal
- which covers information processing in natural and artificial neural
- systems. It publishes original contributions on all aspects of this
- broad subject which involves physics, biology, psychology, computer
- science and engineering. Contributions include research papers,
- reviews and short communications. The journal presents a fresh
- undogmatic attitude towards this multidisciplinary field with the
- aim to be a forum for novel ideas and improved understanding of
- collective and cooperative phenomena with computational capabilities.
- (Note: Remarks supplied by B. Lautrup (editor),
- "LAUTRUP%nbivax.nbi.dk@CUNYVM.CUNY.EDU" )
- Review is reported to be very slow.
- ------
- Title: Neural Network News
- Publish: AIWeek Inc.
- Address: Neural Network News, 2555 Cumberland Parkway, Suite 299, Atlanta, GA
- 30339 USA. Tel: (404) 434-2187
- Freq.: Monthly (beginning September 1989)
- Cost/Yr: USA and Canada $249, Elsewhere $299
- Remark: Commercial Newsletter
- ------
- Title: Network: Computation in Neural Systems
- Publish: IOP Publishing Ltd
- Address: Europe: IOP Publishing Ltd, Techno House, Redcliffe Way, Bristol
- BS1 6NX, UK; IN USA: American Institute of Physics, Subscriber
- Services 500 Sunnyside Blvd., Woodbury, NY 11797-2999
- Freq.: Quarterly (1st issue 1990)
- Cost/Yr: USA: $180, Europe: 110 pounds
- Remark: Description: "a forum for integrating theoretical and experimental
- findings across relevant interdisciplinary boundaries." Contents:
- Submitted articles are reviewed by two technical referees for the
- paper's interdisciplinary format and accessibility. Also Viewpoints and
- Reviews commissioned by the editors, abstracts (with reviews) of
- articles published in other journals, and book reviews.
- Comment: While the price discourages me (my comments are based upon
- a free sample copy), I think that the journal succeeds very well. The
- highest density of interesting articles I have found in any journal.
- (Note: Remarks supplied by brandt kehoe "kehoe@csufres.CSUFresno.EDU")
- ------
- Title: Connection Science: Journal of Neural Computing,
- Artificial Intelligence and Cognitive Research
- Publish: Carfax Publishing
- Address: Europe: Carfax Publishing Company, P. O. Box 25, Abingdon,
- Oxfordshire OX14 3UE, UK. USA: Carfax Publishing Company,
- 85 Ash Street, Hopkinton, MA 01748
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: Individual $82, Institution $184, Institution (U.K.) 74 pounds
- -----
- Title: International Journal of Neural Networks
- Publish: Learned Information
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: 90 pounds
- ISSN #: 0954-9889
- Remark: The journal contains articles, a conference report (at least the
- issue I have), news and a calendar.
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- -----
- Title: Concepts in NeuroScience
- Publish: World Scientific Publishing
- Address: Same Address (?) as for International Journal of Neural Systems
- Freq.: Twice per year (vol. 1 in 1989)
- Remark: Mainly Review Articles(?)
- (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
- -----
- Title: International Journal of Neurocomputing
- Publish: ecn Neurocomputing GmbH
- Freq.: Quarterly (vol. 1 in 1989)
- Remark: Commercial journal, not an academic periodical
- (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
- Review has been reported to be fast (less than 3 months)
- -----
- Title: Neurocomputers
- Publish: Gallifrey Publishing
- Address: Gallifrey Publishing, PO Box 155, Vicksburg, Michigan, 49097, USA
- Tel: (616) 649-3772
- Freq.: Monthly (1st issue 1987?)
- ISSN #: 0893-1585
- Editor: Derek F. Stubbs
- Cost/Yr: $32 (USA, Canada), $48 (elsewhere)
- Remark: I have only one issue, so I cannot give you much detail about
- the contents. It is a very small one (12 pages) but it has a lot
- of (short) information in it about e.g. conferences, books,
- (new) ideas etc. I don't think it is very expensive, but I'm not sure.
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- ------
- Title: JNNS Newsletter (Newsletter of the Japan Neural Network Society)
- Publish: The Japan Neural Network Society
- Freq.: Quarterly (vol. 1 in 1989)
- Remark: (IN JAPANESE LANGUAGE) Official Newsletter of the Japan Neural
- Network Society(JNNS)
- (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
- -------
- Title: Neural Networks Today
- Remark: I found this title on a bulletin board in October of last year,
- in a message from Tim Pattison, timpatt@augean.OZ
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- -----
- Title: Computer Simulations in Brain Science
- -----
- Title: International Journal of Neuroscience
- -----
- Title: Neural Network Computation
- Remark: Possibly the same as "Neural Computation"
- -----
- Title: Neural Computing and Applications
- Freq.: Quarterly
- Publish: Springer Verlag
- Cost/yr: 120 Pounds
- Remark: Is the journal of the Neural Computing Applications Forum.
- Publishes original research and other information
- in the field of practical applications of neural computing.
-
- B. NN Related Journals
- ======================
-
- Title: Complex Systems
- Publish: Complex Systems Publications
- Address: Complex Systems Publications, Inc., P.O. Box 6149, Champaign,
- IL 61821-8149, USA
- Freq.: 6 times per year (1st volume is 1987)
- ISSN #: 0891-2513
- Cost/Yr: Individual $75, Institution $225
- Remark: The journal COMPLEX SYSTEMS is devoted to the rapid publication of
- research on the science, mathematics, and engineering of systems with
- simple components but complex overall behavior. Send mail to
- "jcs@complex.ccsr.uiuc.edu" for additional info.
- (Remark is from announcement on Net)
- -----
- Title: Biological Cybernetics (Kybernetik)
- Publish: Springer Verlag
- Freq.: Monthly (vol. 1 in 1961)
- -----
- Title: Various IEEE Transactions and Magazines
- Publish: IEEE
- Remark: Primarily see IEEE Trans. on System, Man and Cybernetics; Various
- Special Issues: April 1990 IEEE Control Systems Magazine.; May 1989
- IEEE Trans. Circuits and Systems.; July 1988 IEEE Trans. Acoust.
- Speech Signal Process.
- -----
- Title: The Journal of Experimental and Theoretical Artificial Intelligence
- Publish: Taylor & Francis, Ltd.
- Address: London, New York, Philadelphia
- Freq.: ? (1st issue Jan 1989)
- Remark: For submission information, please contact either of the editors:
- Eric Dietrich Chris Fields
- PACSS - Department of Philosophy Box 30001/3CRL
- SUNY Binghamton New Mexico State University
- Binghamton, NY 13901 Las Cruces, NM 88003-0001
- dietrich@bingvaxu.cc.binghamton.edu cfields@nmsu.edu
- -----
- Title: The Behavioral and Brain Sciences
- Publish: Cambridge University Press
- Remark: (Expensive as hell, I'm sure.)
- This is a delightful journal that encourages discussion on a
- variety of controversial topics. I have especially enjoyed reading
- some papers in there by Dana Ballard and Stephen Grossberg (separate
- papers, not collaborations) a few years back. They have a really neat
- concept: they get a paper, then invite a number of noted scientists
- in the field to praise it or trash it. They print these commentaries,
- and give the author(s) a chance to make a rebuttal or concurrence.
- Sometimes, as I'm sure you can imagine, things get pretty lively. I'm
- reasonably sure they are still at it--I think I saw them make a call
- for reviewers a few months ago. Their reviewers are called something
- like Behavioral and Brain Associates, and I believe they have to be
- nominated by current associates, and should be fairly well established
- in the field. That's probably more than I really know about it but
- maybe if you post it someone who knows more about it will correct any
- errors I have made. The main thing is that I liked the articles I
- read. (Note: remarks by Don Wunsch <dwunsch@blake.acs.washington.edu>)
- -----
- Title: International Journal of Applied Intelligence
- Publish: Kluwer Academic Publishers
- Remark: first issue in 1990(?)
- -----
- Title: Bulletin of Mathematical Biology
- -----
- Title: Intelligence
- -----
- Title: Journal of Mathematical Biology
- -----
- Title: Journal of Complex Systems
- -----
- Title: AI Expert
- Publish: Miller Freeman Publishing Co., for subscription call ++415-267-7672.
- Remark: Regularly includes ANN related articles, product
- announcements, and application reports.
- Listings of ANN programs are available on AI Expert affiliated BBS's
- -----
- Title: International Journal of Modern Physics C
- Publish: World Scientific Publ. Co.
- Farrer Rd. P.O.Box 128, Singapore 9128
- or: 687 Hartwell St., Teaneck, N.J. 07666 U.S.A
- or: 73 Lynton Mead, Totteridge, London N20 8DH, England
- Freq: published quarterly
- Eds: G. Fox, H. Herrmann and K. Kaneko
- -----
- Title: Machine Learning
- Publish: Kluwer Academic Publishers
- Address: Kluwer Academic Publishers
- P.O. Box 358
- Accord Station
- Hingham, MA 02018-0358 USA
- Freq.: Monthly (8 issues per year; increasing to 12 in 1993)
- Cost/Yr: Individual $140 (1992); Member of AAAI or CSCSI $88
- Remark: Description: Machine Learning is an international forum for
- research on computational approaches to learning. The journal
- publishes articles reporting substantive research results on a
- wide range of learning methods applied to a variety of task
- domains. The ideal paper will make a theoretical contribution
- supported by a computer implementation.
- The journal has published many key papers in learning theory,
- reinforcement learning, and decision tree methods. Recently
- it has published a special issue on connectionist approaches
- to symbolic reasoning. The journal regularly publishes
- issues devoted to genetic algorithms as well.
- -----
- Title: Journal of Physics A: Mathematical and General
- Publish: Inst. of Physics, Bristol
- Freq: 24 issues per year.
- Remark: Statistical mechanics aspects of neural networks
- (mostly Hopfield models).
-
- -----
- Title: Physical Review A: Atomic, Molecular and Optical Physics
- Publish: The American Physical Society (Am. Inst. of Physics)
- Freq: Monthly
- Remark: Statistical mechanics of neural networks.
-
-
- C. Journals loosely related to NNs
- ==================================
-
- JOURNAL OF COMPLEXITY
- (Must rank alongside Wolfram's Complex Systems)
-
- IEEE ASSP Magazine
- (April 1987 had the Lippmann intro. which everyone likes to cite)
-
- ARTIFICIAL INTELLIGENCE
- (Vol 40, September 1989 had the survey paper by Hinton)
-
- COGNITIVE SCIENCE
- (the Boltzmann machine paper by Ackley et al appeared here in Vol 9, 1985)
-
- COGNITION
- (Vol 28, March 1988 contained the Fodor and Pylyshyn critique of connectionism)
-
- COGNITIVE PSYCHOLOGY
- (no comment!)
-
- JOURNAL OF MATHEMATICAL PSYCHOLOGY
- (several good book reviews)
-
- ------------------------------------------------------------------------
-
- -A12.) The most important conferences concerned with Neural Networks ?
-
- [preliminary]
- [to be added: has taken place how often yet; most emphasized topics;
- where to get proceedings ]
-
- A. Dedicated Neural Network Conferences:
- 1. Neural Information Processing Systems (NIPS)
- Annually in Denver, Colorado; late November or early December
- 2. International Joint Conference on Neural Networks (IJCNN)
- co-sponsored by INNS and IEEE
- 3. Annual Conference on Neural Networks (ACNN)
- 4. International Conference on Artificial Neural Networks (ICANN)
- Annually in Europe(?), 1992 in Brighton
- Major conference of European Neur. Netw. Soc. (ENNS)
-
- B. Other Conferences
- 1. International Joint Conference on Artificial Intelligence (IJCAI)
- 2. Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP)
- 3. Annual Conference of the Cognitive Science Society
- 4. [Vision Conferences?]
-
- C. Pointers to Conferences
- 1. The journal "Neural Networks" has a long list of conferences,
- workshops and meetings in each issue.
- This is quite interdisciplinary.
- 2. There is a regular posting on comp.ai.neural-nets from Paultje Bakker:
- "Upcoming Neural Network Conferences", which lists names, dates,
- locations, contacts, and deadlines.
-
- ------------------------------------------------------------------------
-
- -A13.) Neural Network Associations ?
-
- [Is this data still correct ? Who will send me some update ?]
-
- 1. International Neural Network Society (INNS).
- INNS membership includes subscription to "Neural Networks",
- the official journal of the society.
- Membership is $55 for non-students and $45 for students per year.
- Address: INNS Membership, P.O. Box 491166, Ft. Washington, MD 20749.
-
- 2. International Student Society for Neural Networks (ISSNNets).
- Membership is $5 per year.
- Address: ISSNNet, Inc., P.O. Box 15661, Boston, MA 02215 USA
-
- 3. Women In Neural Network Research and Technology (WINNERS).
- Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206,
- Wheaton, MD 20902. Telephone: 301-933-9000.
-
- 4. European Neural Network Society (ENNS)
-
- 5. Japanese Neural Network Society (JNNS)
- Address: Japanese Neural Network Society
- Department of Engineering, Tamagawa University,
- 6-1-1, Tamagawa Gakuen, Machida City, Tokyo,
- 194 JAPAN
- Phone: +81 427 28 3457, Fax: +81 427 28 3597
-
- 6. Association des Connexionnistes en THese (ACTH)
- (the French Student Association for Neural Networks)
- Membership is 100 FF per year
- Activities : newsletter, conference (every year), list of members...
- Address : ACTH - Le Castelnau R2
- 23 avenue de la Galline
- 34170 Castelnau-le-Lez
- FRANCE
- Contact : jdmuller@vnet.ibm.com
-
- 7. Neurosciences et Sciences de l'Ingenieur (NSI)
- Biology & Computer Science
- Activity : conference (every year)
- Address : NSI - TIRF / INPG
- 46 avenue Felix Viallet
- 38031 Grenoble Cedex
- FRANCE
-
-
- ------------------------------------------------------------------------
-
- -A14.) Other sources of information about NNs ?
-
- 1. Neuron Digest
- Internet Mailing List. From the welcome blurb:
- "Neuron-Digest is a list (in digest form) dealing with all aspects
- of neural networks (and any type of network or neuromorphic system)"
- Moderated by Peter Marvit.
- To subscribe, send email to neuron-request@cattell.psych.upenn.edu
- comp.ai.neural-nets readers also find the messages in that newsgroup
- in the form of digests.
-
- 2. Usenet groups comp.ai.neural-nets (Oha ! :-> )
- and comp.theory.self-org-sys
- There is a periodic posting on comp.ai.neural-nets sent by
- srctran@world.std.com (Gregory Aharonian) about Neural Network
- patents.
-
- 3. Central Neural System Electronic Bulletin Board
- Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry;
- P.O. Box 1187, Richland, WA 99352; welsberr@sandbox.kenn.wa.us
- Available through FidoNet, RBBS-Net, and other EchoMail compatible
- bulletin board systems as NEURAL_NET echo.
-
- 4. USENET newsgroup comp.org.issnnet
- Forum for discussion of academic/student-related issues in NNs, as
- well as information on ISSNNet (see A13) and its activities.
-
-
- ------------------------------------------------------------------------
-
-
-
- That's all folks.
-
- ========================================================================
-
- Acknowledgements: Thanks to all the people who helped to get the stuff
- above into the posting. I cannot name them all, because
- I would make far too many errors then. :->
-
- No ? Not good ? You want individual credit ?
- OK, OK. I'll try to name them all. But: no guarantee....
-
- THANKS FOR HELP TO:
- (in alphabetical order of email addresses, I hope)
-
- Allen Bonde <ab04@harvey.gte.com>
- Alexander Linden <al@jargon.gmd.de>
- S.Taimi Ames <ames@reed.edu>
- anderson@atc.boeing.com
- Kim L. Blackwell <avrama@helix.nih.gov>
- Paul Bakker <bakker@cs.uq.oz.au>
- Yijun Cai <caiy@mercury.cs.uregina.ca>
- L. Leon Campbell <campbell@brahms.udel.edu>
- David DeMers <demers@cs.ucsd.edu>
- Denni Rognvaldsson <denni@thep.lu.se>
- Wesley R. Elsberry <elsberry@cse.uta.edu>
- Frank Schnorrenberg <fs0997@easttexas.tamu.edu>
- Gary Lawrence Murphy <garym@maya.isis.org>
- gaudiano@park.bu.edu
- Glen Clark <opto!glen@gatech.edu>
- guy@minster.york.ac.uk
- Jean-Denis Muller <jdmuller@vnet.ibm.com>
- Jonathan Kamens <jik@MIT.Edu>
- Jon Gunnar Solheim <jon@kongle.idt.unit.no>
- Josef Nelissen <jonas@beor.informatik.rwth-aachen.de>
- Kjetil.Noervaag@idt.unit.no
- Luke Koops <koops@gaul.csd.uwo.ca>
- William Mackeown <mackeown@compsci.bristol.ac.uk>
- Peter Marvit <marvit@cattell.psych.upenn.edu>
- masud@worldbank.org
- Yoshiro Miyata <miyata@sccs.chukyo-u.ac.jp>
- Jyrki Alakuijala <more@ee.oulu.fi>
- mrs@kithrup.com
- Maciek Sitnik <msitnik@plearn.edu.pl>
- R. Steven Rainwater <ncc@ncc.jvnc.net>
- Michael Plonski <plonski@aero.org>
- [myself]
- Richard Andrew Miles Outerbridge <ramo@uvphys.phys.uvic.ca>
- Richard Cornelius <richc@rsf.atd.ucar.edu>
- Rob Cunningham <rkc@xn.ll.mit.edu>
- Osamu Saito <saito@nttica.ntt.jp>
- Ted Stockwell <ted@aps1.spa.umn.edu>
- Thomas G. Dietterich <tgd@research.cs.orst.edu>
- Thomas.Vogel@cl.cam.ac.uk
- Ulrich Wendl <uli@unido.informatik.uni-dortmund.de>
- Matthew P Wiener <weemba@sagi.wistar.upenn.edu>
-
- Bye
-
- Lutz
-
- --
- Lutz Prechelt (email: prechelt@ira.uka.de) | Whenever you
- Institut fuer Programmstrukturen und Datenorganisation | complicate things,
- Universitaet Karlsruhe; D-7500 Karlsruhe 1; Germany | they get
- (Voice: ++49/721/608-4068, FAX: ++49/721/694092) | less simple.
-
-
-